Health insurance mandate

A health insurance mandate is a requirement, imposed on either employers or individuals, to obtain private health insurance instead of (or in addition to) a national health insurance plan.[1]

  1. ^ Austin, D. Andrew; Hungerford, Thomas L. (2010). Market Structure of the Health Insurance Industry. Congressional Research Service, Library of Congress.
